Ridge penalized logistic and ordinal partial least squares regression for predicting stroke deficit from infarct topography

Authors

  • Jian Chen
  • Thanh G. Phan
  • David C. Reutens
Abstract

Improving the ability to assess potential stroke deficit may aid the selection of patients most likely to benefit from acute stroke therapies. Methods based only on 'at risk' volumes or on initial neurological condition predict eventual outcome, but imperfectly. Given the close relationship between anatomy and function in the brain, we propose the use of a modified version of partial least squares (PLS) regression to examine how well stroke outcome covaries with infarct location. The modified version of PLS incorporates penalized regression and can handle either binary or ordinal data. This version is known as partial least squares with penalized logistic regression (PLS-PLR) and has been adapted from its original use for high-dimensional microarray data. We have adapted this algorithm for use with imaging data and demonstrate its use in a set of patients with aphasia (a high-level language disorder) following stroke.
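As a rough illustration of the idea only (not the authors' exact PLS-PLR algorithm), the sketch below reduces high-dimensional binary lesion maps to a few PLS components and then fits a ridge (L2) penalized logistic regression on the component scores. The simulated lesion maps, the toy outcome label, and the two-stage composition are illustrative assumptions.

```python
# Hypothetical sketch of a PLS + ridge-logistic pipeline (not the authors'
# exact PLS-PLR algorithm); the simulated lesion maps and outcome are
# placeholders for real voxel-wise infarct data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_patients, n_voxels = 40, 5000                                  # p >> n, as in imaging data
X = rng.integers(0, 2, (n_patients, n_voxels)).astype(float)     # binary infarct maps
y = (X[:, :50].sum(axis=1) > 25).astype(int)                     # toy deficit label (present/absent)

# Stage 1: PLS extracts a few latent components of infarct topography.
pls = PLSRegression(n_components=3)
x_scores, _ = pls.fit_transform(X, y)                            # n_patients x 3 score matrix

# Stage 2: ridge (L2) penalized logistic regression on the PLS scores.
clf = LogisticRegression(penalty="l2", C=1.0).fit(x_scores, y)
print(clf.predict_proba(x_scores)[:5, 1])                        # in-sample deficit probabilities
```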


Related articles

Pdmclass Function to Classify Microarray Data Using Penalized Discriminant Methods

Description: This function is used to classify microarray data. Since the underlying model fit is based on penalized discriminant methods, there is no need for a pre-filtering step to reduce the number of genes. Usage: pdmClass(formula, method = c("pls", "pcr", "ridge"), keep.fitted = …). Arguments: formula — a symbolic description of the model to be fit (details given below); method — one of "pls", "pcr"...
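pdmClass itself is an R/Bioconductor function; purely as an analogous illustration of the same point (penalization makes gene pre-filtering unnecessary when p >> n), here is a hedged Python sketch using shrinkage-regularized linear discriminant analysis from scikit-learn. The simulated expression matrix and class labels are assumptions.

```python
# Analogous Python sketch (not pdmClass itself): a penalized (shrinkage)
# discriminant classifier applied directly to all features, with no
# gene pre-filtering step. Data are simulated placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n_samples, n_genes = 60, 2000                    # far more genes than samples
X = rng.normal(size=(n_samples, n_genes))
y = rng.integers(0, 2, n_samples)                # two tumour classes
X[y == 1, :20] += 1.0                            # a small informative gene block

# 'lsqr' + shrinkage regularizes the covariance estimate, so the fit is
# well defined even though p >> n and no filtering was applied.
lda = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto").fit(X, y)
print(lda.score(X, y))                           # in-sample accuracy
```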


Classification using partial least squares with penalized logistic regression

MOTIVATION One important aspect of data-mining of microarray data is to discover the molecular variation among cancers. In microarray studies, the number n of samples is relatively small compared to the number p of genes per sample (usually in thousands). It is known that standard statistical methods in classification are efficient (i.e. in the present case, yield successful classifiers) partic...


Shrinkage structure in biased regression

Biased regression is an alternative to ordinary least squares (OLS) regression, especially when explanatory variables are highly correlated. In this paper, we examine the geometrical structure of the shrinkage factors of biased estimators. We show that, in most cases, shrinkage factors cannot belong to [0, 1] in all directions. We also compare the shrinkage factors of ridge regression (RR), pri...
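As a small numerical illustration of what a shrinkage factor means here (my own sketch, not taken from the paper): for ridge regression, the fitted values shrink each OLS component along the singular directions of X by d_j^2 / (d_j^2 + λ), which always lies in [0, 1); the paper's point is that for other biased estimators, such as PLS, the analogous factors need not stay in that interval.

```python
# Sketch: ridge shrinkage factors along the singular directions of X.
# For ridge with penalty lam, the OLS component in direction j is
# multiplied by d_j**2 / (d_j**2 + lam), which lies in [0, 1).
import numpy as np

rng = np.random.default_rng(2)
n, p = 50, 5
# Correlated explanatory variables: the setting where biased regression helps.
base = rng.normal(size=(n, 1))
X = base + 0.1 * rng.normal(size=(n, p))

d = np.linalg.svd(X, compute_uv=False)   # singular values d_1 >= ... >= d_p
lam = 1.0                                # ridge penalty
shrinkage = d**2 / (d**2 + lam)
print(np.round(shrinkage, 3))            # all factors strictly between 0 and 1
```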


Comparison of Ordinal Response Modeling Methods like Decision Trees, Ordinal Forest and L1 Penalized Continuation Ratio Regression in High Dimensional Data

Background: Response variables in most medical and health-related research have an ordinal nature. Conventional modeling methods assume predictor variables to be independent, and consider a large number of samples (n) compared to the number of covariates (p). Therefore, it is not possible to use conventional models for high dimensional genetic data in which p > n. The present study compared th...
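For readers unfamiliar with the continuation-ratio idea mentioned in the title, a hedged sketch (my own, not the study's code): an ordinal response with levels 0 < 1 < 2 can be decomposed into a sequence of conditional binary problems, each fitted with an L1-penalized logistic regression so that p > n remains manageable. All data and tuning values below are placeholders.

```python
# Sketch of an L1-penalized (forward) continuation-ratio model: for each
# level k, among subjects with y >= k, model P(y == k | y >= k) with a
# lasso-penalized logistic regression. Data are simulated placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n, p = 80, 500                                   # p > n, as in genetic data
X = rng.normal(size=(n, p))
latent = X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n)
y = np.digitize(latent, [-0.5, 0.5])             # ordinal outcome with levels 0, 1, 2

models = {}
for k in range(2):                               # one conditional fit per non-top level
    at_risk = y >= k                             # subjects who reached level k
    target = (y[at_risk] == k).astype(int)       # stop at k vs continue past k
    models[k] = LogisticRegression(
        penalty="l1", solver="liblinear", C=0.5
    ).fit(X[at_risk], target)

print({k: int((m.coef_ != 0).sum()) for k, m in models.items()})  # selected features
```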


Penalized Least Squares and Penalized Likelihood

where pλ(·) is the penalty function. Best subset selection corresponds to pλ(t) = (λ/2)·I(t ≠ 0). If we take pλ(t) = λ|t|, then (1.2) becomes the Lasso problem (1.1). Setting pλ(t) = λ{at² + (1 − a)|t|} with 0 ≤ a ≤ 1 results in the method of elastic net. With pλ(t) = λ|t|^q for some 0 < q ≤ 2, it is called bridge regression, which includes the ridge regression as a special case when q = 2. Some penal...
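For orientation, the generic criterion being described, together with the penalty families named above, can be written as follows (stated from the standard penalized least-squares literature rather than from the truncated text):

```latex
% Generic penalized least-squares criterion and the penalty families
% mentioned above (lasso, elastic net, bridge/ridge).
\hat{\beta} \;=\; \arg\min_{\beta}\;
  \frac{1}{2}\,\lVert y - X\beta \rVert_2^2
  \;+\; \sum_{j=1}^{p} p_{\lambda}\!\left(\lvert \beta_j \rvert\right),
\qquad
\begin{cases}
  p_{\lambda}(t) = \lambda t,                     & \text{lasso},\\
  p_{\lambda}(t) = \lambda\{a t^{2} + (1-a)t\},   & \text{elastic net } (0 \le a \le 1),\\
  p_{\lambda}(t) = \lambda t^{q},\; 0 < q \le 2,  & \text{bridge (ridge when } q = 2\text{)}.
\end{cases}
```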



Journal:

Volume   Issue

Pages  -

Publication date: 2010